Deep Learning Online Free Workshop


In 2006, techniques for learning in so-called deep neural networks were discovered. These techniques are now known as deep learning. They have been developed further, and today deep neural networks and deep learning achieve outstanding performance on many important problems in computer vision, speech recognition, and natural language processing.

“In a neural network we don't tell the computer how to solve our problem. Instead, it learns from observational data, figuring out its own solution to the problem at hand.”

Applications

  1. Face Recognition
  2. Image Classification
  3. Speech Recognition
  4. Text-to-speech Generation
  5. Handwriting Transcription
  6. Machine Translation
  7. Medical Diagnosis
  8. Self Driving Cars
  9. Digital Assistants
  10. Ads, search, social recommendations

History In Brief

  1. 1943: Neurons were first modeled with a simple electrical circuit by neurophysiologist Warren McCulloch and mathematician Walter Pitts
  2. 1950-1960: Perceptrons were developed by the scientist Frank Rosenblatt
  3. 1974-86: Backpropagation Algorithm, Recurrent Neural Networks (RNNs)
  4. 1989-98: Convolutional Neural Networks, Bidirectional RNNs, Long Short-Term Memory (LSTM), the MNIST dataset
  5. 2006: “Deep Learning” Concept
  6. 2013-2014: Generative Adversarial Networks / Deep Q-Networks


Algorithm (Neural Network Types)


Today's Focus


Supervised Learning Model


Classification & Regression

  1. Classification: “Classification” indicates that the data has discrete class labels. Classification predictive modeling is the task of approximating a mapping function (f) from input variables (X) to discrete output variables (y), or classes. The output variables are often called labels or categories. The mapping function predicts the class or category for a given observation.

  2. Regression: Regression predictive modeling is the task of approximating a mapping function (f) from input variables (X) to a continuous output variable (y). A continuous output variable is a real value, such as an integer or floating-point value. These are often quantities, such as amounts and sizes. For example, a house may be predicted to sell for a specific dollar value, perhaps in the range of $100,000 to $200,000.
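The distinction can be seen in a tiny NumPy sketch: a regressor outputs a real number, while a classifier assigns one of a fixed set of labels. The house sizes, prices, and class centroids below are made up purely for illustration:

```python
import numpy as np

# Regression: fit price = m*size + c by least squares (made-up data)
sizes = np.array([1000.0, 1500.0, 2000.0])        # square feet
prices = np.array([100000.0, 150000.0, 200000.0])  # dollars
m, c = np.polyfit(sizes, prices, 1)
predicted_price = m * 1200 + c                     # a continuous output

# Classification: pick a discrete label by nearest class centroid
centroids = {'small': 1000.0, 'large': 2000.0}
size = 1200.0
label = min(centroids, key=lambda k: abs(centroids[k] - size))

print(round(predicted_price), label)  # → 120000 small
```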

Feed-Forward Neural Networks and Architecture


How to Train an FFNN

Let's take an example problem: the Iris flower dataset.

The Iris flower data set, or Fisher's Iris data set, is a multivariate data set introduced by the British statistician, eugenicist, and biologist Ronald Fisher in his 1936 paper "The use of multiple measurements in taxonomic problems" as an example of linear discriminant analysis. It is sometimes called Anderson's Iris data set because Edgar Anderson collected the data to quantify the morphologic variation of Iris flowers of three related species. Two of the three species were collected in the Gaspé Peninsula "all from the same pasture, and picked on the same day and measured at the same time by the same person with the same apparatus". Fisher's paper was published in the Annals of Eugenics, which has led to controversy about the continued use of the Iris dataset for teaching statistical techniques today.

The data set consists of 50 samples from each of three species of Iris (Iris setosa, Iris virginica and Iris versicolor). Four features were measured from each sample: the length and the width of the sepals and petals, in centimeters. Based on the combination of these four features, Fisher developed a linear discriminant model to distinguish the species from each other.
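For reference, the same dataset ships with scikit-learn (assuming it is installed), so the shapes described above can be checked directly:

```python
from sklearn.datasets import load_iris

iris = load_iris()
print(iris.data.shape)    # (150, 4): 150 samples, 4 features each
print(iris.target_names)  # ['setosa' 'versicolor' 'virginica']
```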

Labels


Features


Dataset

(https://en.wikipedia.org/wiki/Iris_flower_data_set)

Neural Network Architecture


Training the FFNN


Weights and Biases


Our Problem

Training, Evaluating, and Testing an FFNN to Identify Handwritten Digits.

1. Identifying Features and Labels

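To make the features and labels concrete: each 28×28 grayscale digit image can be flattened into a vector of 784 pixel intensities, which becomes the network's input, while the digit (0-9) is the label. A minimal NumPy sketch (the image and label here are synthetic placeholders, not a real MNIST sample):

```python
import numpy as np

# A synthetic 28x28 grayscale "image" with pixel values in [0, 255]
image = np.random.randint(0, 256, size=(28, 28))
label = 7  # the digit this image would depict (hypothetical)

# Flatten to a 784-dimensional feature vector and scale to [0, 1]
features = image.reshape(-1).astype(np.float32) / 255.0

print(features.shape)  # (784,)
```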

Train Test Split


Evaluating the Algorithm


In [9]:
import pandas as pd

# Load the Iris CSV: the first four columns are features, the last is the species name
dataset = pd.read_csv('iris.csv').values

data = dataset[:, 0:4]
target = dataset[:, 4]
In [10]:
from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
# an empty NN created

model.add(Dense(64, input_dim=4, activation='relu'))  # hidden layer 1 (4 input features)
model.add(Dense(128, activation='relu'))              # hidden layer 2
model.add(Dense(64, activation='relu'))               # hidden layer 3
model.add(Dense(3, activation='softmax'))             # output layer: 3 classes

model.compile(loss='categorical_crossentropy', optimizer='sgd', metrics=['accuracy'])
model.summary()
Using TensorFlow backend.
Model: "sequential_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_1 (Dense)              (None, 64)                320       
_________________________________________________________________
dense_2 (Dense)              (None, 128)               8320      
_________________________________________________________________
dense_3 (Dense)              (None, 64)                8256      
_________________________________________________________________
dense_4 (Dense)              (None, 3)                 195       
=================================================================
Total params: 17,091
Trainable params: 17,091
Non-trainable params: 0
_________________________________________________________________
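The parameter counts in the summary follow directly from inputs × units + units (one bias per unit):

```python
# (inputs, units) for each Dense layer in the model above
layers = [(4, 64), (64, 128), (128, 64), (64, 3)]
params = [n_in * n_out + n_out for n_in, n_out in layers]
print(params, sum(params))  # [320, 8320, 8256, 195] 17091
```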
In [15]:
# Map species names to integer class indices
new_target = []

for i in target:
    if i == 'setosa':
        new_target.append(0)
    elif i == 'versicolor':
        new_target.append(1)
    else:
        new_target.append(2)
In [17]:
from keras.utils import np_utils

# One-hot encode the integer labels (e.g. 1 -> [0, 1, 0])
new_target = np_utils.to_categorical(new_target)
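The loop-plus-`to_categorical` step can equivalently be sketched with a dictionary lookup and a NumPy identity matrix (same result, just more compact; the four-element `target` list is a toy example):

```python
import numpy as np

target = ['setosa', 'versicolor', 'virginica', 'versicolor']  # toy example
mapping = {'setosa': 0, 'versicolor': 1, 'virginica': 2}
indices = [mapping[name] for name in target]

# One-hot encode: row i of the identity matrix is the vector for class i
one_hot = np.eye(3)[indices]
print(one_hot)
```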
In [21]:
from sklearn.model_selection import train_test_split

# Hold out 10% of the samples for testing
train_data, test_data, train_target, test_target = train_test_split(data, new_target, test_size=0.1)
In [22]:
history=model.fit(train_data,train_target,epochs=100)
Epoch 1/100
135/135 [==============================] - 2s 17ms/step - loss: 1.0354 - accuracy: 0.3407
Epoch 2/100
135/135 [==============================] - 0s 52us/step - loss: 0.9703 - accuracy: 0.4000
Epoch 3/100
135/135 [==============================] - 0s 63us/step - loss: 0.8917 - accuracy: 0.6963
Epoch 4/100
135/135 [==============================] - 0s 59us/step - loss: 0.8625 - accuracy: 0.6815
Epoch 5/100
135/135 [==============================] - 0s 59us/step - loss: 0.7985 - accuracy: 0.8148
Epoch 6/100
135/135 [==============================] - 0s 52us/step - loss: 0.7596 - accuracy: 0.6889
Epoch 7/100
135/135 [==============================] - 0s 44us/step - loss: 0.7353 - accuracy: 0.8000
Epoch 8/100
135/135 [==============================] - 0s 44us/step - loss: 0.6901 - accuracy: 0.7111
Epoch 9/100
135/135 [==============================] - 0s 66us/step - loss: 0.6590 - accuracy: 0.8444
Epoch 10/100
135/135 [==============================] - 0s 37us/step - loss: 0.6158 - accuracy: 0.8741
Epoch 11/100
135/135 [==============================] - 0s 52us/step - loss: 0.5896 - accuracy: 0.7704
Epoch 12/100
135/135 [==============================] - 0s 44us/step - loss: 0.5616 - accuracy: 0.9111
Epoch 13/100
135/135 [==============================] - 0s 56us/step - loss: 0.5385 - accuracy: 0.8889
Epoch 14/100
135/135 [==============================] - 0s 59us/step - loss: 0.5280 - accuracy: 0.8593
Epoch 15/100
135/135 [==============================] - 0s 52us/step - loss: 0.4920 - accuracy: 0.8815
Epoch 16/100
135/135 [==============================] - 0s 44us/step - loss: 0.4945 - accuracy: 0.8815
Epoch 17/100
135/135 [==============================] - 0s 44us/step - loss: 0.4670 - accuracy: 0.9259
Epoch 18/100
135/135 [==============================] - 0s 66us/step - loss: 0.4478 - accuracy: 0.8815
Epoch 19/100
135/135 [==============================] - 0s 52us/step - loss: 0.4286 - accuracy: 0.8889
Epoch 20/100
135/135 [==============================] - 0s 44us/step - loss: 0.4275 - accuracy: 0.8593
Epoch 21/100
135/135 [==============================] - 0s 52us/step - loss: 0.4167 - accuracy: 0.9185
Epoch 22/100
135/135 [==============================] - 0s 52us/step - loss: 0.4099 - accuracy: 0.9111
Epoch 23/100
135/135 [==============================] - 0s 52us/step - loss: 0.3976 - accuracy: 0.8963
Epoch 24/100
135/135 [==============================] - 0s 133us/step - loss: 0.3921 - accuracy: 0.8889
Epoch 25/100
135/135 [==============================] - 0s 66us/step - loss: 0.3915 - accuracy: 0.8593
Epoch 26/100
135/135 [==============================] - 0s 59us/step - loss: 0.3563 - accuracy: 0.8889
Epoch 27/100
135/135 [==============================] - 0s 81us/step - loss: 0.3423 - accuracy: 0.9259
Epoch 28/100
135/135 [==============================] - 0s 103us/step - loss: 0.3519 - accuracy: 0.8889
Epoch 29/100
135/135 [==============================] - 0s 89us/step - loss: 0.3513 - accuracy: 0.8963
Epoch 30/100
135/135 [==============================] - 0s 59us/step - loss: 0.3349 - accuracy: 0.9111
Epoch 31/100
135/135 [==============================] - 0s 52us/step - loss: 0.3121 - accuracy: 0.9630
Epoch 32/100
135/135 [==============================] - 0s 53us/step - loss: 0.3051 - accuracy: 0.9556
Epoch 33/100
135/135 [==============================] - 0s 41us/step - loss: 0.2953 - accuracy: 0.9481
Epoch 34/100
135/135 [==============================] - 0s 44us/step - loss: 0.2901 - accuracy: 0.9333
Epoch 35/100
135/135 [==============================] - 0s 37us/step - loss: 0.2858 - accuracy: 0.9778
Epoch 36/100
135/135 [==============================] - 0s 66us/step - loss: 0.2793 - accuracy: 0.9704
Epoch 37/100
135/135 [==============================] - 0s 44us/step - loss: 0.2784 - accuracy: 0.9407
Epoch 38/100
135/135 [==============================] - 0s 44us/step - loss: 0.2644 - accuracy: 0.9407
Epoch 39/100
135/135 [==============================] - 0s 44us/step - loss: 0.2783 - accuracy: 0.9111
Epoch 40/100
135/135 [==============================] - 0s 52us/step - loss: 0.2599 - accuracy: 0.9556
Epoch 41/100
135/135 [==============================] - 0s 37us/step - loss: 0.2791 - accuracy: 0.9185
Epoch 42/100
135/135 [==============================] - 0s 44us/step - loss: 0.2731 - accuracy: 0.9185
Epoch 43/100
135/135 [==============================] - 0s 37us/step - loss: 0.2798 - accuracy: 0.9259
Epoch 44/100
135/135 [==============================] - 0s 52us/step - loss: 0.2394 - accuracy: 0.9704
Epoch 45/100
135/135 [==============================] - 0s 37us/step - loss: 0.2305 - accuracy: 0.9704
Epoch 46/100
135/135 [==============================] - 0s 81us/step - loss: 0.2909 - accuracy: 0.8741
Epoch 47/100
135/135 [==============================] - 0s 37us/step - loss: 0.2279 - accuracy: 0.9630
Epoch 48/100
135/135 [==============================] - 0s 44us/step - loss: 0.2328 - accuracy: 0.9185
Epoch 49/100
135/135 [==============================] - 0s 37us/step - loss: 0.2529 - accuracy: 0.9259
Epoch 50/100
135/135 [==============================] - 0s 44us/step - loss: 0.2475 - accuracy: 0.9333
Epoch 51/100
135/135 [==============================] - 0s 37us/step - loss: 0.2284 - accuracy: 0.9630
Epoch 52/100
135/135 [==============================] - 0s 52us/step - loss: 0.2309 - accuracy: 0.9481
Epoch 53/100
135/135 [==============================] - 0s 52us/step - loss: 0.2140 - accuracy: 0.9556
Epoch 54/100
135/135 [==============================] - 0s 52us/step - loss: 0.2126 - accuracy: 0.9407
Epoch 55/100
135/135 [==============================] - 0s 44us/step - loss: 0.2565 - accuracy: 0.9185
Epoch 56/100
135/135 [==============================] - 0s 44us/step - loss: 0.2942 - accuracy: 0.8444
Epoch 57/100
135/135 [==============================] - 0s 44us/step - loss: 0.1960 - accuracy: 0.9704
Epoch 58/100
135/135 [==============================] - 0s 44us/step - loss: 0.2318 - accuracy: 0.9111
Epoch 59/100
135/135 [==============================] - 0s 37us/step - loss: 0.1815 - accuracy: 0.9704
Epoch 60/100
135/135 [==============================] - 0s 37us/step - loss: 0.2195 - accuracy: 0.9037
Epoch 61/100
135/135 [==============================] - 0s 96us/step - loss: 0.1747 - accuracy: 0.9778
Epoch 62/100
135/135 [==============================] - 0s 37us/step - loss: 0.2165 - accuracy: 0.9333
Epoch 63/100
135/135 [==============================] - 0s 37us/step - loss: 0.1723 - accuracy: 0.9704
Epoch 64/100
135/135 [==============================] - 0s 37us/step - loss: 0.1680 - accuracy: 0.9778
Epoch 65/100
135/135 [==============================] - 0s 44us/step - loss: 0.1707 - accuracy: 0.9556
Epoch 66/100
135/135 [==============================] - 0s 37us/step - loss: 0.1662 - accuracy: 0.9630
Epoch 67/100
135/135 [==============================] - 0s 37us/step - loss: 0.1683 - accuracy: 0.9407
Epoch 68/100
135/135 [==============================] - 0s 37us/step - loss: 0.1599 - accuracy: 0.9556
Epoch 69/100
135/135 [==============================] - 0s 44us/step - loss: 0.1956 - accuracy: 0.9259
Epoch 70/100
135/135 [==============================] - 0s 59us/step - loss: 0.2316 - accuracy: 0.9037
Epoch 71/100
135/135 [==============================] - 0s 44us/step - loss: 0.1746 - accuracy: 0.9630
Epoch 72/100
135/135 [==============================] - 0s 37us/step - loss: 0.1634 - accuracy: 0.9630
Epoch 73/100
135/135 [==============================] - 0s 44us/step - loss: 0.1652 - accuracy: 0.9704
Epoch 74/100
135/135 [==============================] - 0s 37us/step - loss: 0.1544 - accuracy: 0.9704
Epoch 75/100
135/135 [==============================] - 0s 37us/step - loss: 0.1866 - accuracy: 0.9259
Epoch 76/100
135/135 [==============================] - 0s 37us/step - loss: 0.1465 - accuracy: 0.9778
Epoch 77/100
135/135 [==============================] - 0s 44us/step - loss: 0.1876 - accuracy: 0.9407
Epoch 78/100
135/135 [==============================] - 0s 44us/step - loss: 0.1595 - accuracy: 0.9630
Epoch 79/100
135/135 [==============================] - 0s 37us/step - loss: 0.1397 - accuracy: 0.9704
Epoch 80/100
135/135 [==============================] - 0s 52us/step - loss: 0.1277 - accuracy: 0.9704
Epoch 81/100
135/135 [==============================] - 0s 37us/step - loss: 0.1617 - accuracy: 0.9407
Epoch 82/100
135/135 [==============================] - 0s 37us/step - loss: 0.1737 - accuracy: 0.9407
Epoch 83/100
135/135 [==============================] - 0s 44us/step - loss: 0.1706 - accuracy: 0.9481
Epoch 84/100
135/135 [==============================] - 0s 44us/step - loss: 0.1468 - accuracy: 0.9481
Epoch 85/100
135/135 [==============================] - 0s 44us/step - loss: 0.1439 - accuracy: 0.9778
Epoch 86/100
135/135 [==============================] - 0s 37us/step - loss: 0.1543 - accuracy: 0.9481
Epoch 87/100
135/135 [==============================] - 0s 37us/step - loss: 0.1835 - accuracy: 0.9407
Epoch 88/100
135/135 [==============================] - 0s 44us/step - loss: 0.2042 - accuracy: 0.9111
Epoch 89/100
135/135 [==============================] - 0s 52us/step - loss: 0.1275 - accuracy: 0.9630
Epoch 90/100
135/135 [==============================] - 0s 37us/step - loss: 0.2415 - accuracy: 0.9111
Epoch 91/100
135/135 [==============================] - 0s 37us/step - loss: 0.1234 - accuracy: 0.9778
Epoch 92/100
135/135 [==============================] - 0s 44us/step - loss: 0.1260 - accuracy: 0.9778
Epoch 93/100
135/135 [==============================] - 0s 44us/step - loss: 0.1498 - accuracy: 0.9704
Epoch 94/100
135/135 [==============================] - 0s 66us/step - loss: 0.1188 - accuracy: 0.9852
Epoch 95/100
135/135 [==============================] - 0s 52us/step - loss: 0.1515 - accuracy: 0.9333
Epoch 96/100
135/135 [==============================] - 0s 52us/step - loss: 0.1467 - accuracy: 0.9333
Epoch 97/100
135/135 [==============================] - 0s 52us/step - loss: 0.1726 - accuracy: 0.9259
Epoch 98/100
135/135 [==============================] - 0s 59us/step - loss: 0.1569 - accuracy: 0.9333
Epoch 99/100
135/135 [==============================] - 0s 44us/step - loss: 0.1270 - accuracy: 0.9778
Epoch 100/100
135/135 [==============================] - 0s 66us/step - loss: 0.1551 - accuracy: 0.9407
In [25]:
from matplotlib import pyplot as plt

plt.plot(history.history['loss'])
Out[25]:
[<matplotlib.lines.Line2D at 0x1780dc801d0>]
In [26]:
plt.plot(history.history['accuracy'])
Out[26]:
[<matplotlib.lines.Line2D at 0x1780dce0eb8>]
In [28]:
predicted_target=model.predict(test_data)
In [29]:
print('Actual results:',test_target)
print('Predicted results:',predicted_target)
Actual results: [[0. 0. 1.]
 [0. 0. 1.]
 [0. 0. 1.]
 [0. 1. 0.]
 [0. 1. 0.]
 [0. 1. 0.]
 [0. 1. 0.]
 [1. 0. 0.]
 [0. 1. 0.]
 [0. 1. 0.]
 [1. 0. 0.]
 [0. 0. 1.]
 [0. 1. 0.]
 [1. 0. 0.]
 [1. 0. 0.]]
Predicted results: [[3.2213600e-05 2.1285828e-02 9.7868198e-01]
 [3.2444492e-05 1.1295879e-02 9.8867166e-01]
 [2.4139554e-04 8.7632306e-02 9.1212624e-01]
 [1.2665496e-02 8.8888484e-01 9.8449662e-02]
 [2.0316190e-03 6.0349965e-01 3.9446872e-01]
 [4.1805715e-03 5.5158216e-01 4.4423723e-01]
 [5.8492199e-03 7.2703004e-01 2.6712072e-01]
 [9.8473465e-01 1.5235600e-02 2.9708812e-05]
 [4.6566452e-04 1.3832299e-01 8.6121130e-01]
 [2.1171826e-03 6.7693561e-01 3.2094720e-01]
 [9.9261332e-01 7.3790080e-03 7.7287505e-06]
 [9.4354555e-06 6.6486262e-03 9.9334198e-01]
 [6.4889323e-03 7.5167340e-01 2.4183770e-01]
 [9.9194169e-01 8.0514029e-03 6.8629593e-06]
 [9.9777538e-01 2.2237170e-03 9.0828280e-07]]
In [30]:
import numpy as np

print('Actual results:',np.argmax(test_target,axis=1))
print('Predicted results:',np.argmax(predicted_target,axis=1))
Actual results: [2 2 2 1 1 1 1 0 1 1 0 2 1 0 0]
Predicted results: [2 2 2 1 1 1 1 0 2 1 0 2 1 0 0]
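With the class indices in hand, test accuracy is just the fraction of matching entries. Using the arrays printed in this run:

```python
import numpy as np

actual = np.array([2, 2, 2, 1, 1, 1, 1, 0, 1, 1, 0, 2, 1, 0, 0])
predicted = np.array([2, 2, 2, 1, 1, 1, 1, 0, 2, 1, 0, 2, 1, 0, 0])

accuracy = np.mean(actual == predicted)
print(accuracy)  # ≈ 0.933: 14 of 15 correct, only sample 8 is misclassified
```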